
    Empathy, Challenge, and Psychophysiological Activation in Therapist-Client Interaction

    Two central dimensions of psychotherapeutic work are the therapist's empathy with clients and the challenging of their judgments. We investigated how these influence psychophysiological responses in the participants. The data came from psychodynamic therapy sessions (24 sessions from 5 dyads), from which 694 therapist interventions were coded. Heart rate and electrodermal activity (EDA) of the participants were used to index emotional arousal, and facial muscle activity (electromyography) to index positive and negative emotional facial expressions. Electrophysiological data were analyzed in two time frames: (a) during the therapist's interventions and (b) across the whole psychotherapy session. Both empathy and challenge affected psychophysiological responses in the participants. Therapists' empathy decreased clients' EDA and increased their own EDA across the session. Therapists' challenge increased their own EDA in response to the interventions, but not across the sessions. Clients, on the other hand, did not respond to challenges during interventions, but challenges tended to increase their EDA across a session. Furthermore, there was an interaction effect between empathy and challenge: heart rate decreased and positive facial expressions increased in sessions where empathy and challenge were coupled, i.e., where the amount of both was either high or low. This suggests that the two variables work together. The results highlight the therapeutic functions and interrelation of empathy and challenge and, in line with the dyadic systems theory of Beebe and Lachmann (2002), the systemic linkage between the interactional expression of emotion and the individual regulation of emotion.

    The Human Auditory Sensory Memory Trace Persists about 10 sec: Neuromagnetic Evidence

    Neuromagnetic responses were recorded to frequent "standard" tones of 1000 Hz and to infrequent 1100-Hz "deviant" tones with a 24-channel planar SQUID gradiometer. Stimuli were presented at constant interstimulus intervals (ISIs) ranging from 0.75 to 12 sec. The standards evoked a prominent 100-msec response, N100m, which increased in amplitude with increasing ISI. N100m could be dissociated into two subcomponents with different source areas. The posterior component, N100m2, increased as the ISI grew up to 6 sec, whereas the more anterior component, N100m1, probably continued its growth beyond the 12-sec ISI. At ISIs from 0.75 to 9 sec, the deviants additionally elicited a mismatch field (MMF). The equivalent sources of both N100m and MMF were at the supratemporal auditory cortex. We assume that auditory stimuli leave in the auditory system a trace that affects the processing of subsequent stimuli. The decrement of the N100m amplitude, as well as the elicitation of the MMF, can be considered indirect evidence of active traces. A behavioral estimate of the persistence of auditory sensory memory was obtained in a separate experiment in which the subjects compared, without attending to the stimuli, tones presented at the different ISIs. The subjects discriminated the stimuli better than by chance at ISIs of 0.75-9 sec. The ISI dependences of the behavioral estimate, of N100m2, and of the MMF are similar enough to suggest a common underlying mechanism that retains information for a period of about 10 sec.

    Sequentiality, Mutual Visibility, and Behavioral Matching: Body Sway and Pitch Register During Joint Decision Making

    We studied behavioral matching during joint decision making. Drawing on motion-capture and voice data from 12 dyads, we analyzed body-sway and pitch-register matching during sequential transitions and continuations, with and without mutual visibility. Body sway was matched most strongly during sequential transitions under mutual visibility. Pitch-register matching was higher during sequential transitions than during continuations only when the participants could not see each other. These results suggest that both body sway and pitch register are used to manage sequential transitions, while mutual visibility influences the relative weights of these two resources. The conversational data are in Finnish with English translation.
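    One common way to quantify moment-to-moment behavioral matching of two continuous signals, such as the body-sway or pitch traces of two participants, is a windowed correlation. The sketch below is only an illustration of that general idea, not the authors' actual analysis; the window and step sizes are arbitrary assumptions.

```python
import numpy as np

def windowed_matching(x, y, win, step):
    """Pearson correlation of two time series in sliding windows,
    as one simple index of moment-to-moment behavioral matching."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    scores = []
    for start in range(0, len(x) - win + 1, step):
        xw = x[start:start + win]
        yw = y[start:start + win]
        # corrcoef returns a 2x2 matrix; the off-diagonal entry is r
        scores.append(np.corrcoef(xw, yw)[0, 1])
    return np.array(scores)
```

    Matching scores computed this way could then be compared between sequential positions (transition vs. continuation) and visibility conditions.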

    Social pleasures of music

    Humans across all societies engage in music listening and music making, which they find pleasurable, even though music does not appear to have any obvious survival value. Here we review recent studies on the social dimensions of music that contribute to music-induced hedonia. A meta-analysis of neuroimaging studies shows that listening to positively and to negatively valenced music elicits largely similar activation patterns. Activation patterns found during the processing of social signals and of music are also remarkably similar. These similarities may reflect the inherent sociability of music, and the fact that musical pleasures are consistently associated with autobiographical events linked with musical pieces. The brain's mu-opioid receptor (OR) system, which governs social bonding, also modulates musical pleasures, and both listening to and making music increase prosociality and OR activity. Finally, real or simulated interpersonal synchrony signals affiliation, and accordingly music-induced movements increase social closeness and pleasant feelings. We conclude that these links between music and interpersonal affiliation are an important mechanism that makes music so rewarding.

    EEG-Based Brain-Computer Interface for Tetraplegics

    Movement-disabled persons typically require a long practice time to learn how to use a brain-computer interface (BCI). Our aim was to develop a BCI that tetraplegic subjects could learn to control in only 30 minutes. Six such subjects (level of injury C4-C5) operated a 6-channel EEG BCI. The task was to move a circle from the centre of the computer screen to its right or left side by attempting visually triggered right- or left-hand movements. During the training periods, the classifier was adapted to the user's EEG activity after each movement attempt in a supervised manner. Feedback on performance was given immediately after starting the BCI use. Within the time limit, three subjects learned to control the BCI. We believe that fast initial learning is an important factor that increases motivation and willingness to use BCIs. We have previously tested a similar single-trial classification approach with healthy subjects. Our new results show that methods developed and tested with healthy subjects do not necessarily work as well with motor-disabled patients. Therefore, it is important to include motor-disabled persons as subjects in BCI development.
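    The supervised, trial-by-trial adaptation described above can be illustrated with a minimal sketch: after each labelled movement attempt, the classifier's class statistics are updated incrementally, so the system tracks the user's EEG from the very first trials. This is a generic nearest-class-mean illustration under assumed feature vectors, not the authors' actual classifier.

```python
import numpy as np

class AdaptiveMeanClassifier:
    """Nearest-class-mean classifier whose class means are updated
    after every labelled movement attempt (supervised adaptation)."""

    def __init__(self, n_features):
        self.means = {0: np.zeros(n_features), 1: np.zeros(n_features)}
        self.counts = {0: 0, 1: 0}

    def update(self, features, label):
        # Incremental running-mean update after each attempt:
        # mean <- mean + (x - mean) / n
        self.counts[label] += 1
        self.means[label] += (features - self.means[label]) / self.counts[label]

    def predict(self, features):
        # Classify to the class with the nearer mean (0 = left, 1 = right)
        d0 = np.linalg.norm(features - self.means[0])
        d1 = np.linalg.norm(features - self.means[1])
        return 0 if d0 < d1 else 1
```

    In an online setting, `predict` would drive the circle's movement while `update` runs after each cued attempt, giving the user immediate feedback from the start of training.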

    Emotion, psychophysiology, and intersubjectivity

    Publisher Copyright: © 2021 John Benjamins Publishing Company. Conversation-analytic studies on emotion show how the expression of emotion is part of intersubjective experience. Emotions, however, are as much physiological as experiential events. Physiological processes pertaining to emotion involve changes in cardiovascular activity, in the activation of sweat glands, and in muscular activity. The dyadic systems theory of Beebe and Lachmann (2002) suggests that actions that regulate social interaction also serve in the regulation of the internal emotional states of the interacting subjects. Drawing on this theory, our overall research question was: how is the expression of emotion in social interaction linked to physiological responses in the participants? Our main result was that through conversational affiliation, the participants share the emotional load of the interaction.

    Gaze-Direction-Based MEG Averaging During Audiovisual Speech Perception

    To take a step towards real-life-like experimental setups, we simultaneously recorded magnetoencephalographic (MEG) signals and the subject's gaze direction during audiovisual speech perception. The stimuli were utterances of /apa/ dubbed onto two side-by-side female faces articulating /apa/ (congruent) and /aka/ (incongruent) in synchrony, repeated once every 3 s. Subjects (N = 10) were free to decide which face they viewed, and responses were averaged into two categories according to gaze direction. The right-hemisphere 100-ms response to the onset of the second vowel (N100m’) was a fifth smaller for incongruent than for congruent stimuli. The results demonstrate the feasibility of realistic viewing conditions with gaze-based averaging of MEG signals.
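    The gaze-based averaging described above amounts to partitioning single-trial epochs by the concurrently recorded gaze direction and averaging each partition separately. A minimal sketch, assuming the MEG data are already segmented into epochs and each trial has been assigned a gaze label (array shapes and names are illustrative, not the authors' pipeline):

```python
import numpy as np

def gaze_based_average(epochs, gaze_labels):
    """Average MEG epochs separately per gaze-direction category.

    epochs      : array of shape (n_trials, n_channels, n_times)
    gaze_labels : per-trial label of which face was viewed, e.g. 'left'/'right'
    Returns a dict mapping each gaze direction to its averaged evoked response.
    """
    epochs = np.asarray(epochs)
    gaze_labels = np.asarray(gaze_labels)
    return {g: epochs[gaze_labels == g].mean(axis=0)
            for g in np.unique(gaze_labels)}
```

    With the two side-by-side faces, the 'left' and 'right' averages correspond to trials on which the subject viewed the congruent or the incongruent articulation, so evoked responses such as N100m’ can be compared between the two viewing categories.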